In translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT). First, this history is summarized in a hierarchical way. We then integrate the historical representation into NMT using two strategies: 1) a warm-start of encoder and decoder states, and 2) an auxiliary context source for updating decoder states. Experimental results on a large Chinese-English translation task show that our approach significantly improves upon a strong attention-based NMT system by up to +2.1 BLEU points.
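The hierarchical summarization and the warm-start strategy can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: it assumes simple tanh-RNN cells with random placeholder weights, a word-level RNN that summarizes each previous sentence, and a sentence-level RNN that summarizes those summaries into a single history vector, which then initializes the decoder state.

```python
import numpy as np

def rnn_summarize(vectors, W, U, b):
    """Run a simple tanh RNN over a sequence of vectors; return the final state."""
    h = np.zeros(W.shape[0])
    for x in vectors:
        h = np.tanh(W @ h + U @ x + b)
    return h

def hierarchical_history(history_sentences, d=4, seed=0):
    """Summarize preceding sentences hierarchically:
    a word-level RNN per sentence, then a sentence-level RNN over the
    per-sentence summaries. Weights here are random placeholders."""
    rng = np.random.default_rng(seed)
    Ww, Uw, bw = rng.normal(size=(d, d)), rng.normal(size=(d, d)), np.zeros(d)
    Ws, Us, bs = rng.normal(size=(d, d)), rng.normal(size=(d, d)), np.zeros(d)
    sent_summaries = [rnn_summarize(s, Ww, Uw, bw) for s in history_sentences]
    return rnn_summarize(sent_summaries, Ws, Us, bs)

# Two previous sentences, represented as lists of (toy) word embeddings.
history = [[np.ones(4), np.zeros(4)], [np.full(4, 0.5)]]
D = hierarchical_history(history)

# Strategy 1 (warm start): use the history vector as the decoder's initial state,
# instead of the usual initialization from the source sentence alone.
decoder_init_state = D
```

Strategy 2 would instead keep the standard initialization and feed `D` as an extra input when updating each decoder hidden state, alongside the attention context.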